Fast Solution of ℓ1-Norm Minimization Problems When the Solution May Be Sparse

Authors

  • David L. Donoho
  • Yaakov Tsaig
  • Michael Lustig
Abstract

The minimum ℓ1-norm solution to an underdetermined system of linear equations y = Ax is often, remarkably, also the sparsest solution to that system. This sparsity-seeking property is of interest in signal processing and information transmission. However, general-purpose optimizers are much too slow for ℓ1 minimization in many large-scale applications. The Homotopy method was originally proposed by Osborne et al. for solving noisy overdetermined ℓ1-penalized least squares problems. We here apply it to solve the noiseless underdetermined ℓ1-minimization problem min ‖x‖1 subject to y = Ax. We show that Homotopy runs much more rapidly than general-purpose LP solvers when sufficient sparsity is present. Indeed, the method often has the following k-step solution property: if the underlying solution has only k nonzeros, the Homotopy method reaches that solution in only k iterative steps. When this property holds and k is small compared to the problem size, this means that ℓ1 minimization problems with k-sparse solutions can be solved in a fraction of the cost of solving one full-sized linear system. We demonstrate this k-step solution property for two kinds of problem suites. First, incoherent matrices A, where off-diagonal entries of the Gram matrix AᵀA are all smaller than M. If y is a linear combination of at most k ≤ (M⁻¹ + 1)/2 columns of A, we show that Homotopy has the k-step solution property. Second, ensembles of d × n random matrices A. If A has iid Gaussian entries, then, when y is a linear combination of at most k < d/(2 log(n)) · (1 − εₙ) columns, with εₙ > 0 small, Homotopy again exhibits the k-step solution property with high probability. Further, we give evidence showing that for ensembles of d × n partial orthogonal matrices, including partial Fourier matrices and partial Hadamard matrices, with high probability, the k-step solution property holds up to a dramatically higher threshold k, satisfying k/d < ρ̂(d/n), for a certain empirically-determined function ρ̂(δ). Our results imply that Homotopy can efficiently solve some very ambitious large-scale problems arising in stylized applications of error-correcting codes, magnetic resonance imaging, and NMR spectroscopy. Our approach also sheds light on the evident parallelism in results on ℓ1 minimization and Orthogonal Matching Pursuit (OMP), and aids in explaining the inherent relations between Homotopy, LARS, OMP, and Polytope Faces Pursuit.
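
As an illustration of the problem and of the k-step behavior described above, the following is a minimal NumPy sketch of a Homotopy/LARS-style path solver for min ‖x‖1 subject to y = Ax, together with a small experiment on an iid Gaussian matrix. It is a simplified reading of the method discussed in the abstract, not the authors' implementation; the function name `homotopy_l1`, the tolerances, and the problem sizes (d = 100, n = 256, k = 8) are illustrative assumptions.

```python
import numpy as np

def homotopy_l1(A, y, tol=1e-10, max_iter=None):
    """Follow the piecewise-linear solution path of the l1-penalized
    least-squares problem as the penalty lambda decreases from ||A^T y||_inf
    toward 0; when followed to lambda = 0, the endpoint is a minimum-l1
    solution of y = A x (illustrative sketch, not the authors' code)."""
    d, n = A.shape
    x = np.zeros(n)
    c = A.T @ y                          # correlations A^T (y - A x); here x = 0
    lam = np.max(np.abs(c))              # penalty at which x = 0 stops being optimal
    active = [int(np.argmax(np.abs(c)))]
    steps = 0
    for _ in range(max_iter or 10 * d):
        if lam <= tol:
            break
        steps += 1
        A_I = A[:, active]
        s = np.sign(c[active])
        d_I = np.linalg.solve(A_I.T @ A_I, s)   # update direction on the active set
        b = A.T @ (A_I @ d_I)                   # rate of change of all correlations
        gamma, event = lam, ("end", None)       # never step past lambda = 0
        # smallest step at which an inactive column's correlation hits the boundary
        for j in set(range(n)) - set(active):
            for den, num in ((1.0 - b[j], lam - c[j]), (1.0 + b[j], lam + c[j])):
                if abs(den) > tol and tol < num / den < gamma:
                    gamma, event = num / den, ("join", j)
        # smallest step at which an active coefficient would cross zero
        for pos, i in enumerate(active):
            if abs(d_I[pos]) > tol and tol < -x[i] / d_I[pos] < gamma:
                gamma, event = -x[i] / d_I[pos], ("drop", i)
        x[active] += gamma * d_I
        c -= gamma * b
        lam -= gamma
        if event[0] == "join":
            active.append(event[1])
        elif event[0] == "drop":
            active.remove(event[1])
            x[event[1]] = 0.0
    return x, steps

# Small experiment: iid Gaussian A and a k-sparse x0 (sizes are illustrative).
rng = np.random.default_rng(0)
d, n, k = 100, 256, 8
A = rng.standard_normal((d, n)) / np.sqrt(d)
x0 = np.zeros(n)
x0[rng.choice(n, size=k, replace=False)] = rng.standard_normal(k)
x_hat, steps = homotopy_l1(A, A @ x0)
print(steps, np.linalg.norm(x_hat - x0))   # typically k steps and a near-zero error
```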

Related articles

Conjugate gradient acceleration of iteratively re-weighted least squares methods

Iteratively Re-weighted Least Squares (IRLS) is a method for solving minimization problems with non-quadratic, possibly non-convex and non-smooth, cost functions that can nevertheless be described as the infimum over a family of quadratic functions. This transformation suggests an algorithmic scheme in which a sequence of quadratic problems is solved, each of which can be tackled efficiently by tools of numerical lin...
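
To make the quadratic-subproblem idea concrete, here is a minimal IRLS sketch for the related basis-pursuit problem min ‖x‖1 subject to Ax = y: each pass replaces the ℓ1 cost with a weighted quadratic and solves the resulting equality-constrained least-squares problem in closed form. The conjugate gradient acceleration that is the subject of the cited paper is not shown (a dense solve is used instead), and the smoothing parameter eps, iteration count, and problem sizes are illustrative assumptions.

```python
import numpy as np

def irls_l1(A, y, iters=50, eps=1e-8):
    """Basic IRLS sketch for  min ||x||_1  subject to  A x = y:
    each pass replaces |x_i| by the quadratic surrogate w_i * x_i**2 and solves
    the resulting equality-constrained weighted least-squares problem in
    closed form (dense solve; no CG acceleration)."""
    x = A.T @ np.linalg.solve(A @ A.T, y)        # start from the minimum-l2 solution
    for _ in range(iters):
        w = 1.0 / np.sqrt(x**2 + eps**2)         # reweighting: w_i ~ 1 / |x_i|
        D_At = A.T / w[:, None]                  # D A^T with D = diag(1 / w)
        x = D_At @ np.linalg.solve(A @ D_At, y)  # argmin sum_i w_i x_i^2  s.t.  A x = y
    return x

# Example use (sizes illustrative): recover a sparse x0 from y = A x0.
rng = np.random.default_rng(1)
A = rng.standard_normal((50, 128)) / np.sqrt(50)
x0 = np.zeros(128)
x0[rng.choice(128, size=5, replace=False)] = 1.0
print(np.linalg.norm(irls_l1(A, A @ x0) - x0))   # typically small after enough passes
```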

Face Recognition in Thermal Images based on Sparse Classifier

Despite recent advances, face recognition systems still suffer from serious problems because of the wide variety of changes in the human face (such as lighting, glasses, head tilt, and different emotional states). Each of these factors can significantly reduce face recognition accuracy. Several methods have been proposed by researchers to overcome these problems. Nonetheless, in recent ye...

Fast Solution of l1-Norm Minimization Problems When the Solution May Be Sparse

The minimum ℓ1-norm solution to an underdetermined system of linear equations y = Ax is often, remarkably, also the sparsest solution to that system. This sparsity-seeking property is of interest in signal processing and information transmission. However, general-purpose optimizers are much too slow for ℓ1 minimization in many large-scale applications. The Homotopy method was originally propos...

Sparse solutions of linear complementarity problems

This paper considers the characterization and computation of sparse solutions and least-p-norm (0 < p < 1) solutions of the linear complementarity problem LCP(q,M). We show that the number of non-zero entries of any least-p-norm solution of the LCP(q,M) is less than or equal to the rank of M for any arbitrary matrix M and any number p ∈ (0, 1), and there is p̄ ∈ (0, 1) such that all least-p-norm...
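
For reference, the complementarity problem and the least-p-norm objective discussed above can be written as follows; the conventions used here are the standard ones and are assumed, so details may differ from the cited paper.

```latex
% Standard statement of LCP(q,M) and of the least-p-norm selection problem
% (0 < p < 1); conventions assumed here may differ in detail from the paper.
\[
\mathrm{LCP}(q,M):\quad \text{find } z \in \mathbb{R}^{n} \text{ with }\;
z \ge 0,\qquad Mz + q \ge 0,\qquad z^{\mathsf{T}}(Mz + q) = 0,
\]
\[
\min_{z}\ \|z\|_{p}^{p} \;=\; \sum_{i=1}^{n} |z_i|^{p}
\qquad \text{subject to } z \text{ solving } \mathrm{LCP}(q,M),
\qquad 0 < p < 1 .
\]
```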

Linearized Bregman iterations for compressed sensing

Finding a solution of a linear equation Au = f with various minimization properties arises in many applications. One such application is compressed sensing, where an efficient and robust-to-noise algorithm for finding a minimal ℓ1-norm solution is needed. This means that the algorithm should be tailored to large-scale, completely dense matrices A for which Au and Aᵀu can be computed by fast ...
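
Below is a minimal sketch of the linearized Bregman iteration for min ‖u‖1 subject to Au = f, in its commonly cited two-line form: a gradient-like update of an auxiliary variable followed by soft thresholding. The parameters mu and delta, the stopping rule, and the default step size are illustrative assumptions; delta must satisfy a step-size condition (e.g., delta ≤ 1/‖A‖₂²), and in the intended large-scale setting the products A·u and Aᵀ·r would be applied by fast transforms rather than dense matrices.

```python
import numpy as np

def linearized_bregman(A, f, mu=10.0, delta=1.0, iters=5000, tol=1e-8):
    """Linearized Bregman sketch for  min ||u||_1  subject to  A u = f.
    Each iteration needs only one product with A and one with A^T plus a
    soft-threshold. The default delta = 1.0 assumes, e.g., orthonormal rows
    of A; otherwise choose delta <= 1 / ||A||_2**2 for convergence."""
    n = A.shape[1]
    u = np.zeros(n)
    v = np.zeros(n)
    for _ in range(iters):
        r = f - A @ u
        if np.linalg.norm(r) <= tol * np.linalg.norm(f):
            break
        v += A.T @ r                                              # dual/gradient step
        u = delta * np.sign(v) * np.maximum(np.abs(v) - mu, 0.0)  # soft thresholding
    return u
```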

Journal title:

Volume   Issue

Pages   -

Publication date: 2006